From Constants to Random Variables: The Bayesian Paradigm
MATH003 Lesson 7
00:00

The fundamental shift in the Bayesian paradigm lies in the ontological status of the unknown parameter $\theta$. Unlike frequentist statistics, which treats $\theta$ as a fixed but unknown constant, the Bayesian approach treats $\theta$ as a random variable. This allows us to quantify uncertainty through a prior probability measure $\Pi$.

The Bayesian Model Construction

A complete Bayesian model is defined by the pair $(\{f_{\theta} : \theta \in \Omega\}, \Pi)$. Bayesian inference is not merely "using Bayes' Theorem," but the deliberate act of adding a prior probability distribution to the sampling model as an essential ingredient for inference.

The Joint Distribution

The total state of our knowledge is captured by the joint distribution of $(\theta, s)$, with density $\pi(\theta) f_{\theta}(s)$: the prior weights each parameter value, and the sampling density weights the data given that value. This links the observed data $s$ and the unobserved parameter $\theta$ in a single coherent probabilistic framework.
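As a minimal sketch, consider a coin-flip model with a discrete prior on $\theta$. The grid of parameter values, the prior weights, and the observed data below are all illustrative assumptions, not taken from the lesson:

```python
# Discrete Bayesian model: theta ranges over {0.2, 0.5, 0.8}
# with prior weights pi(theta) chosen purely for illustration.
prior = {0.2: 0.25, 0.5: 0.50, 0.8: 0.25}

def likelihood(theta, s):
    """f_theta(s): probability of the observed Bernoulli(theta) sequence s."""
    k, n = sum(s), len(s)
    return theta**k * (1 - theta)**(n - k)

s = [1, 1, 0, 1]  # hypothetical data: three heads, one tail

# Joint density pi(theta) * f_theta(s), evaluated at the observed s
joint = {theta: prior[theta] * likelihood(theta, s) for theta in prior}
```

Evaluating the joint at the observed $s$ gives one number per parameter value, which is exactly the "total state of knowledge" that both prior and data contribute to.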

Direct Probability Statements

In this paradigm, $\theta$ is governed by a probability density $\pi(\theta)$. This allows us to make direct probability statements about the parameter, such as $P(\theta \in A)$. No meaningful statement of this kind is available in the frequentist framework: there $\theta$ is a fixed constant with no distribution, so $P(\theta \in A)$ is trivially 0 or 1 and expresses no uncertainty.
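Continuing with a discrete coin-flip model (grid, prior weights, and data all assumed for illustration), a direct probability statement such as $P(\theta > 0.4 \mid s)$ is just a sum of normalized joint values:

```python
# Illustrative discrete prior pi(theta); not from the lesson
prior = {0.2: 0.25, 0.5: 0.50, 0.8: 0.25}

def likelihood(theta, s):
    """f_theta(s): probability of the observed Bernoulli(theta) sequence s."""
    k, n = sum(s), len(s)
    return theta**k * (1 - theta)**(n - k)

s = [1, 1, 0, 1]  # hypothetical data

joint = {t: prior[t] * likelihood(t, s) for t in prior}
evidence = sum(joint.values())          # marginal probability of the data s
posterior = {t: j / evidence for t, j in joint.items()}

# A direct probability statement about the parameter: P(theta > 0.4 | s)
p_A = sum(p for t, p in posterior.items() if t > 0.4)
```

The quantity `p_A` is a probability *about the parameter itself*, which is precisely the kind of statement the frequentist framework cannot make.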

⚠️ Critical Pitfall: The Posterior Axiom
Note that choosing to use the posterior distribution for probability statements about $\theta$ is an axiom, or principle, of the Bayesian school—not a theorem derived from more basic statistical truths. We assume that the posterior represents our updated state of rational belief.

Real-World Analogy: Medical Diagnostics

In diagnostics for a rare disease, the "constant" is whether a patient has the disease. In the Bayesian paradigm, we treat the disease status $(\theta)$ as a random variable. If the prevalence is 0.1% (the prior), and a test (the model $f_{\theta}$) returns positive, we do not just look at the test's accuracy; we look at the joint probability of having the disease AND testing positive to determine the new probability of illness.
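The diagnostic reasoning above can be sketched numerically. The 0.1% prevalence is from the text; the test's sensitivity and specificity below are assumed values for illustration only:

```python
# Prior: disease prevalence of 0.1% (from the text)
p_disease = 0.001

# Test characteristics -- illustrative assumptions, not given in the text
sensitivity = 0.99   # P(positive | disease)
specificity = 0.95   # P(negative | no disease)

# Joint probabilities of each disease status AND a positive result
joint_pos_disease = p_disease * sensitivity              # has disease, tests positive
joint_pos_healthy = (1 - p_disease) * (1 - specificity)  # healthy, tests positive

# Posterior: P(disease | positive), the "new probability of illness"
p_disease_given_pos = joint_pos_disease / (joint_pos_disease + joint_pos_healthy)
```

Under these assumed test characteristics the posterior is only about 2%: even an accurate test cannot overwhelm a very small prior, which is why looking at the joint probabilities, rather than the test accuracy alone, matters.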

🎯 Core Principle
Bayesian inference adds the prior probability distribution to the sampling model for the data as an additional ingredient to be used in determining inferences about the unknown value of the parameter.